An isotropic Gaussian mixture can have more modes than components
Authors
Abstract
Carreira-Perpinan and Williams (2003) conjectured that a homoscedastic Gaussian mixture of M components in d ≥ 1 dimensions has at most M modes. Prof. J. J. Duistermaat (personal communication, 2003) provided the counterexample of a 3-component mixture in d = 2 where the Gaussians are located at the vertices of an equilateral triangle; for a certain range of variances, modes are present near the vertices and also at the centre of the triangle. In this paper we illustrate the nature of the counterexample and compute the range of variances for which there are more than 3 maxima. We also extend the construction to the regular simplex with M vertices and show that for M ≥ 2 there is always a range of variances for which M+1 modes are present.
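The counterexample is easy to check numerically. The sketch below (an illustration of the construction, not the paper's own code or exact variance bounds) places three equally weighted isotropic Gaussians at the vertices of a unit-side equilateral triangle and counts strict local maxima of the density on a grid; the variance values 0.14, 0.17 and 0.25 are illustrative choices intended to land below, inside and above the four-mode range for this scaling, whose exact endpoints are what the paper computes.

```python
# Numerical sketch of Duistermaat's counterexample (illustrative values only).
import numpy as np

# Vertices of an equilateral triangle with side length 1.
MU = np.array([[0.0, 0.0],
               [1.0, 0.0],
               [0.5, np.sqrt(3.0) / 2.0]])

def mixture_density(x, y, sigma2):
    """Equal-weight isotropic Gaussian mixture density evaluated on a grid."""
    p = np.zeros_like(x)
    for m in MU:
        d2 = (x - m[0]) ** 2 + (y - m[1]) ** 2
        p += np.exp(-d2 / (2.0 * sigma2)) / (2.0 * np.pi * sigma2)
    return p / len(MU)

def count_modes(sigma2, n=501):
    """Count strict local maxima of the density over a regular grid."""
    xs = np.linspace(-0.5, 1.5, n)
    ys = np.linspace(-0.6, 1.5, n)
    x, y = np.meshgrid(xs, ys)
    p = mixture_density(x, y, sigma2)
    inner = p[1:-1, 1:-1]
    is_max = np.ones_like(inner, dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:
                is_max &= inner > p[1 + di:n - 1 + di, 1 + dj:n - 1 + dj]
    return int(is_max.sum())

# Small variance: 3 modes near the vertices; intermediate variance: a fourth
# mode appears at the centre; large variance: everything merges into 1 mode.
for s2 in (0.14, 0.17, 0.25):
    print(f"sigma^2 = {s2}: {count_modes(s2)} mode(s)")
```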
Similar articles
On the Number of Modes of a Gaussian Mixture
We consider a problem intimately related to the creation of maxima under Gaussian blurring: the number of modes of a Gaussian mixture in D dimensions. To our knowledge, a general answer to this question is not known. We conjecture that if the components of the mixture have the same covariance matrix (or the same covariance matrix up to a scaling factor), then the number of modes cannot exceed t...
IMAGE SEGMENTATION USING GAUSSIAN MIXTURE MODEL
Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models play a key role in probabilistic data analysis. In this paper, we have fitted a Gaussian mixture model to the pixels of an image. The parameters of the model have been estimated by the EM algorithm. In addition, the pixel labeling corresponding to each pixel of the true image is made by Bayes' rule. In fact, ...
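As a rough illustration of the pipeline this abstract describes (not the authors' implementation), the sketch below fits a Gaussian mixture to grayscale pixel intensities with EM via scikit-learn and labels each pixel by its maximum-posterior component, i.e. the Bayes-rule step; the synthetic image, the number of segments and the seed are illustrative choices.

```python
# Gaussian-mixture segmentation of pixel intensities (illustrative sketch).
import numpy as np
from sklearn.mixture import GaussianMixture

def segment(image: np.ndarray, n_segments: int = 3, seed: int = 0) -> np.ndarray:
    pixels = image.reshape(-1, 1).astype(float)       # one feature: intensity
    gmm = GaussianMixture(n_components=n_segments, random_state=seed)
    gmm.fit(pixels)                                    # parameters estimated by EM
    labels = gmm.predict(pixels)                       # maximum-posterior label per pixel
    return labels.reshape(image.shape)

# Synthetic example: two intensity populations plus noise.
rng = np.random.default_rng(0)
img = np.where(rng.random((64, 64)) < 0.5, 0.2, 0.8) + 0.05 * rng.standard_normal((64, 64))
print(np.unique(segment(img, n_segments=2)))
```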
Modes or models: a critique on independent component analysis for fMRI.
are only uncorrelated). More importantly, ICA does this in a fashion that renders the expression of the components non-Gaussian. In the implementation proposed by McKeown et al. these distributions are super-Gaussian or 'sparse'. This simply means that things happen infrequently. Why is a 'sparse', or more generally a non-Gaussian, distribution interesting? The answer to this question is simpl...
A Novel Merging Algorithm in Gaussian Mixture Probability Hypothesis Density Filter for Close Proximity Targets Tracking
This paper proposes a novel merging algorithm in the Gaussian mixture probability hypothesis density (GM-PHD) filter to track targets in close proximity. The proposed algorithm is added after the GM-PHD recursion, for the case in which more than one target has the same state. The weights of the Gaussian components decide whether the components can be utilized to extract states, and the means and covariances of the Gaussian ...
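For context, the sketch below shows the conventional merging step used in standard GM-PHD implementations, in which the strongest remaining component absorbs its neighbours within a Mahalanobis-distance threshold; this is only the baseline that merging algorithms such as the one above modify, not the novel rule proposed in this paper, and the threshold value and array shapes are illustrative.

```python
# Baseline Gaussian-component merging for a GM-PHD filter (illustrative sketch).
import numpy as np

def merge_components(weights, means, covs, threshold=4.0):
    """weights: (N,), means: (N, d), covs: (N, d, d). Returns merged arrays."""
    idx = list(range(len(weights)))
    out_w, out_m, out_P = [], [], []
    while idx:
        j = max(idx, key=lambda i: weights[i])          # strongest remaining component
        Pj_inv = np.linalg.inv(covs[j])
        close = [i for i in idx
                 if (means[i] - means[j]) @ Pj_inv @ (means[i] - means[j]) <= threshold]
        w = sum(weights[i] for i in close)               # combined weight
        m = sum(weights[i] * means[i] for i in close) / w
        P = sum(weights[i] * (covs[i] + np.outer(m - means[i], m - means[i]))
                for i in close) / w                      # moment-matched covariance
        out_w.append(w); out_m.append(m); out_P.append(P)
        idx = [i for i in idx if i not in close]         # remove merged components
    return np.array(out_w), np.array(out_m), np.array(out_P)
```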
Image Segmentation using Gaussian Mixture Model
Abstract: Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models play a key role in probabilistic data analysis. In this paper, we applied a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, the pixel labeling corresponding to each pixel of the true image was made by Bayes' rule. In fact, ...